Jensen-Shannon Divergence of Two Eddy Current Distributions Induced by Circular and Fractal Koch Excitation Coils

Authors

Abstract

Eddy current distribution is important to the performance of planar eddy current probes. In this paper, Jensen-Shannon divergences of the tangential intersection angle spectrum and of the radial direction energy were proposed to evaluate the difference between the distributions generated by circular and fractal Koch excitation coils. Simulations of the two coil shapes show that the difference between the two kinds of coils becomes larger with an increase in the values of the divergences. At the same time, the correlation between the change in divergence and the detectability of a short crack in a special direction was discussed through experimental results. It was found that, relative to the 0° direction, the differential pickup probe at 90° has a ... spectrum. The width of each signal ...
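
The comparison step described in the abstract treats the two spectra as probability distributions and scores their dissimilarity with the Jensen-Shannon divergence. A minimal sketch of that computation, assuming the spectra are already available as non-negative histograms; the array names and bin values below are hypothetical placeholders, not data from the paper:

    import numpy as np
    from scipy.spatial.distance import jensenshannon

    def js_divergence(p, q):
        """JSD(P||Q) = 0.5*KL(P||M) + 0.5*KL(Q||M) with M = (P+Q)/2, in bits."""
        p = np.asarray(p, dtype=float) / np.sum(p)   # normalize histograms to probabilities
        q = np.asarray(q, dtype=float) / np.sum(q)
        m = 0.5 * (p + q)

        def kl(a, b):
            mask = a > 0                             # 0 * log(0/x) is taken as 0
            return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    # Hypothetical intersection-angle histograms for a circular coil and a Koch
    # fractal coil -- placeholder values only.
    circular_spectrum = np.array([12, 30, 55, 80, 55, 30, 12], dtype=float)
    koch_spectrum     = np.array([25, 40, 45, 50, 45, 40, 25], dtype=float)

    jsd = js_divergence(circular_spectrum, koch_spectrum)
    print(f"JSD (base 2): {jsd:.4f}")   # 0 = identical distributions, 1 = disjoint support
    # SciPy's helper returns the JS *distance*, i.e. the square root of the divergence:
    print(f"sqrt(JSD): {jensenshannon(circular_spectrum, koch_spectrum, base=2):.4f}")

A larger divergence value corresponds to a larger difference between the two eddy current distributions, which is the quantity the paper relates to crack detectability.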

Similar articles

Non-parametric Jensen-Shannon Divergence

Quantifying the difference between two distributions is a common problem in many machine learning and data mining tasks. What is also common in many tasks is that we only have empirical data. That is, we do not know the true distributions nor their form, and hence, before we can measure their divergence we first need to assume a distribution or perform estimation. For exploratory purposes this ...

Nonextensive Generalizations of the Jensen-Shannon Divergence

Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences, by extending its two building bloc...
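
For reference, the construction this abstract alludes to can be written out explicitly; this is the standard definition of the Kullback-Leibler divergence and its Jensen-Shannon symmetrization, not notation taken from the cited paper:

    % Discrete distributions P = (p_i), Q = (q_i); M is their equal-weight mixture.
    \[
      D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i},
      \qquad
      M = \tfrac{1}{2}\,(P + Q),
    \]
    \[
      \mathrm{JSD}(P \,\|\, Q)
        = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, M)
        + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \,\|\, M).
    \]

Replacing Q by the mixture M keeps every logarithm finite, which is the smoothing, and averaging the two KL terms makes the expression symmetric in P and Q.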

A Note on Bound for Jensen-Shannon Divergence by Jeffreys

We present a lower bound on the Jensen-Shannon divergence by the Jeffreys' divergence when p_i ≥ q_i is satisfied. In the original Lin's paper [IEEE Trans. Info. Theory, 37, 145 (1991)], where the divergence was introduced, the upper bound in terms of the Jeffreys divergence was a quarter of it. In view of a recent sharper one reported by Crooks, we present a discussion on upper bounds by transcendental fu...

Properties of Classical and Quantum Jensen-Shannon Divergence

Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most important divergence measure of information theory, Kullback divergence. As opposed to Kullback divergence it determines in a very direct way a metric; indeed, it is the square of a metric. We consider a family of divergence measures (JDα for α > 0), the Jensen divergences of order α, which generalize JD as JD1 = J...
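
The metric property mentioned here (JD is the square of a metric, so its square root is a proper distance) can be spot-checked numerically. A small illustrative sketch, assuming SciPy is available and using randomly generated distributions rather than anything from the cited paper:

    import numpy as np
    from scipy.spatial.distance import jensenshannon

    rng = np.random.default_rng(0)

    def random_dist(k):
        # A random point on the probability simplex (placeholder data).
        x = rng.random(k)
        return x / x.sum()

    # jensenshannon() returns the Jensen-Shannon *distance*, i.e. sqrt(JD).
    p, q, r = random_dist(8), random_dist(8), random_dist(8)
    d_pq = jensenshannon(p, q, base=2)
    d_qr = jensenshannon(q, r, base=2)
    d_pr = jensenshannon(p, r, base=2)

    # Symmetry and the triangle inequality should hold for every draw.
    assert np.isclose(d_pq, jensenshannon(q, p, base=2))
    assert d_pr <= d_pq + d_qr + 1e-12
    print(f"d(P,Q)={d_pq:.4f}  d(Q,R)={d_qr:.4f}  d(P,R)={d_pr:.4f}")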

Manifold Learning and the Quantum Jensen-Shannon Divergence Kernel

The quantum Jensen-Shannon divergence kernel [1] was recently introduced in the context of unattributed graphs where it was shown to outperform several commonly used alternatives. In this paper, we study the separability properties of this kernel and we propose a way to compute a low-dimensional kernel embedding where the separation of the different classes is enhanced. The idea stems from the ...

Journal

Journal title: International Journal of Engineering, Transactions A: Basics

Year: 2022

ISSN: 1728-1431

DOI: https://doi.org/10.5829/ije.2022.35.07a.12